NNDT  Neural Network Development Tool
Evaluation version 1.1
Björn Saxen 1995

List of changes and (some of the) bug corrections


1.0 -> 1.1
==============
Changes:
- Command line arguments are supported, enabling non-interactive runs
- The pattern file may have up to 35 columns
- Max number of nodes per layer increased to 15, max number of weights
  increased to 200
- The user's guide is split into three separate ps-files, since there seem
  to be problems with printing large ps-files on some systems. All three
  parts are packed into MANUAL.ZIP
- "Always reread pattern file" check box added -> changes made to the
  pattern file contents between subsequent runs take effect
- Multiple sessions of NNDT are possible
- A log file option added
- The seed for the random number generator is saved in the setup file; the
  seed is not re-generated when the "initialise" window is opened
- The default value for the relative weight change limit is set to 100%,
  since this is often a suitable value (an illustrative sketch of such a
  limit is given after the 0.98c -> 0.98d section below)
- The default directory is changed to that of the latest file selected in
  any of the file dialog boxes

Fixed bugs:
- A GPF occurred if memory allocation failed in NNDTCALC.DLL
- An extra line sometimes occurred in the training progress graph


0.99 -> 1.0
==============
Changes:
- A user's guide is delivered with NNDT (MANUAL.ZIP)
- The network calculations are carried out by NNDTCALC.DLL, situated in the
  directory created for NNDT. (NETSOLV.DLL, used by older versions of NNDT,
  can be removed from the \system directory)

Fixed bugs:
- An error occurred in the "training progress" plot if the number of
  iterations exceeded ~6000
- The axis labels in the "performance graph" were drawn in a strange way
  for some min and max values
- The initial state table and the network activation table (in the network
  state window) were not completely shown for certain network configurations


0.98d -> 0.99
==============
Changes:
- Modified installation routine. Manual setup is made easy since the
  program files are not compressed.

Fixed bugs:
- Certain large data sets and/or networks caused a GPF in the network
  calculation routine (NETSOLV.DLL)


0.98c -> 0.98d
==============
Changes:
- Faster network training due to improvements in the algorithm and
  optimized compiling
- Weight initialisation can be made separately for each layer, and the seed
  for the random number generator can be specified by the user
- Better plot possibilities
- Node activations for test data can be analysed in the state window
- Recurrent networks without input signals can be evaluated with a number
  of patterns different from that of the pattern file
- A demo setup (demo1b.mlp) for an autonomous recurrent network added
- Reduced memory requirements at startup

Fixed bugs:
- The node to feed back could not be selected when only one feedback
  connection was chosen
- The analytical gradient (derivatives) was calculated incorrectly for
  recurrent nets when several periods with equal initial states were used
- The use of the word "parameter" in the help file etc. was somewhat
  inconsistent. From now on, a parameter is a network weight, bias or
  initial state.
- Still some problems with a decimal separator other than the dot (.)
- Test file activations were not calculated if the number of observations
  in the test file was less than required for network training
- Graphs were not updated for the final trained net when training converged
  between redraws (redraw interval > 1)
- Residuals for test data were not plotted correctly

Thanks to Mats and Henrik for testing!
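
The relative weight change limit mentioned under 1.0 -> 1.1 is described
here only by its default value (100%). As a rough illustration, assuming
the limit simply caps each weight update at a fraction of the current
weight's magnitude, such a constraint could look like the sketch below.
The function and variable names are assumptions for illustration only, not
NNDT code.

    # Illustrative sketch only -- not taken from NNDT.
    # Assumption: the limit caps each update at rel_limit * |weight|.
    def limit_weight_update(weights, updates, rel_limit=1.0):
        # rel_limit = 1.0 corresponds to the 100% default mentioned above
        limited = []
        for w, dw in zip(weights, updates):
            bound = rel_limit * abs(w)
            limited.append(max(-bound, min(bound, dw)))
        return limited

    # The first update is capped at 100% of |0.5|; the second is unchanged.
    print(limit_weight_update([0.5, -0.2], [1.0, 0.05]))   # [0.5, 0.05]

A limit of this kind prevents a single training step from changing any
weight by more than its own magnitude.
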
0.98b -> 0.98c
==============
Changes:
- New options for the graph: internal node activations and weights can be
  plotted; test data can also be plotted
- The graph is updated immediately after changing plot variables or options
- Buttons added to the main window for faster access to other windows

Fixed bugs:
- Real time mode did not work properly -> this option removed!
- rms error for test patterns at iteration 0 was plotted incorrectly
- Error in screen updating after use of single step mode
- Some parameters were not initialized when a new setup was chosen


0.98 -> 0.98b
=============
Fixed bugs:
- An error in the setup save routine caused problems when "old" weights
  were used
- No network evaluation was made before training, which caused a wrong
  value for "iteration zero" in the training progress plot


0.96 -> 0.98
===========
Changes:
- Analytical derivatives available -> better training performance
- Teacher forcing option for recurrent nets (an illustrative sketch is
  given at the end of this file)
- New weight constraints: penalty term & max. rel. change
- Specification of constant and equal weights possible
- A separate test file can be used
- Time, iteration index, SSQ and rms error are written to a log table after
  each iteration
- A plot of rms error for training and test data vs iteration index is
  available

Fixed bugs:
- Using a decimal separator other than the dot (.) as default in the MS
  Windows environment caused problems, hopefully not any more


0.9 -> 0.96
===========
Changes:
- Menus and file selection restructured to a more "standard" form
- "New setup" option added
- Pattern file setup made clearer (hopefully)
- OK button in the about-box when chosen from the menu
- Faster aborting when the stop button is pressed during iteration
- Default redraw interval set to 1; this also limits the training to *one*
  step at a time in "single step mode"
- Better possibilities to examine node activations for separate patterns,
  both during iteration steps and after iteration
- Button added to verify weight changes made in the network state window
- Residuals, i.e. the difference between network and desired outputs, can
  be plotted as an alternative to the outputs
- The plot "density" can be increased to make a faster (but less accurate)
  plot
- Check box for pattern data filtering added
- Network parameter window accessed by double-clicking the picture

Fixed bugs:
- Some user actions in the network state window caused illegal function
  calls
- Only one header line was read from the pattern file, even if a greater
  number was specified
- Some missing lines in the setup file DEMO1.MLP caused errors when DEMO1
  was opened after a setup with feed-back nodes
- Multiple instances of NNDT caused GPFs in the training routines
  (NETSOLV.DLL). Now, only one NNDT can call the DLL at a time.
- Decreasing the number of variables to be read from the pattern file did
  not always work
- Specification of the feed-back nodes was not always easy
- + some other bugs which I don't remember...
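
The teacher forcing option listed under 0.96 -> 0.98 refers to the common
technique for training recurrent nets in which the desired (measured)
outputs, rather than the network's own previous outputs, are fed back
during training. The sketch below only illustrates this general idea; the
function names and the way the feedback is wired are assumptions, not
NNDT's implementation.

    # Illustrative sketch only -- not taken from NNDT.
    def run_recurrent(x_seq, y_seq, step, teacher_forcing=True):
        # step(inputs) computes the network outputs for one pattern.
        # With teacher forcing the desired outputs y_seq[t-1] are fed back;
        # otherwise the network's own previous outputs are fed back.
        y_prev = [0.0] * len(y_seq[0])
        predictions = []
        for t, x in enumerate(x_seq):
            fed_back = y_seq[t - 1] if (teacher_forcing and t > 0) else y_prev
            y_prev = step(list(x) + list(fed_back))
            predictions.append(y_prev)
        return predictions

When a trained net is evaluated (for example the autonomous recurrent
network mentioned under 0.98c -> 0.98d), teacher_forcing would be set to
False so that the network runs on its own feedback.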